
async tp allreduce #7115

Open · inkcherry wants to merge 7 commits into master

Conversation

@inkcherry (Contributor) commented Mar 7, 2025

No description provided.
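The PR itself carries no description, but the title suggests overlapping the tensor-parallel all-reduce with independent computation instead of blocking on it. Below is a minimal sketch of that general pattern, assuming an initialized torch.distributed process group; the function and tensor names are illustrative only and are not the PR's actual implementation:

```python
import torch
import torch.distributed as dist

def tp_row_linear_async(x, weight_shard):
    """Row-parallel linear layer whose all-reduce is launched asynchronously.

    Each rank holds one shard of the weight, so the partial outputs must be
    summed across the tensor-parallel group. Passing async_op=True returns a
    Work handle, letting the reduction run while other work proceeds.
    """
    partial = x @ weight_shard  # local matmul on this rank's shard
    work = dist.all_reduce(partial, op=dist.ReduceOp.SUM, async_op=True)
    return partial, work  # caller decides when to block

def consume(partial, work, next_input):
    # Overlap window: do computation that does not depend on the reduction.
    prefetch = next_input.contiguous()
    # Block only once the reduced activations are actually needed.
    work.wait()
    return partial, prefetch
```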

@hwchen2017 (Contributor) left a comment:

Please fix the formatting errors by running pre-commit, and fix the DCO error.

@inkcherry (Contributor, Author) replied:

> Please fix the formatting errors by running pre-commit, and fix the DCO error.

fixed.
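For reference, both fixes the reviewer asked for are mechanical. A hedged sketch of the usual routine, written as a small Python driver since the underlying commands are plain CLI calls; the helper name is hypothetical:

```python
# Hypothetical helper mirroring the reviewer's request: re-run the repo's
# pre-commit hooks to fix formatting, then re-sign the last commit so the
# DCO check passes.
import subprocess

def fix_format_and_dco():
    # Apply every configured pre-commit hook to the whole tree; pre-commit
    # exits nonzero when hooks modify files, so don't treat that as fatal.
    subprocess.run(["pre-commit", "run", "--all-files"], check=False)
    # Amend the latest commit with a Signed-off-by trailer (-s), which is
    # what the DCO bot verifies. A force-push of the branch (for example
    # `git push --force-with-lease`) is then needed to update the PR.
    subprocess.run(["git", "commit", "--amend", "-s", "--no-edit"], check=True)

if __name__ == "__main__":
    fix_format_and_dco()
```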

inkcherry and others added 6 commits March 17, 2025 11:02
Signed-off-by: inkcherry <[email protected]>
…i#7135)

Copy changes from deepspeedai/DeepSpeed-MII#558.
Fixes issue where docs still referenced CLA.

---------

Signed-off-by: Logan Adams <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Fix deepspeedai#7132

Signed-off-by: Olatunji Ruwase <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Keeps lines within PEP 8 length limits.
Enhances readability with a single, concise expression.
Preserves original functionality.

---------

Signed-off-by: Shaik Raza Sikander <[email protected]>
Signed-off-by: Olatunji Ruwase <[email protected]>
Signed-off-by: Max Kovalenko <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Signed-off-by: shaomin <[email protected]>
Signed-off-by: Stas Bekman <[email protected]>
Signed-off-by: siqi <[email protected]>
Signed-off-by: Logan Adams <[email protected]>
Signed-off-by: Wei Wu <[email protected]>
Signed-off-by: ShellyNR <[email protected]>
Signed-off-by: Lai, Yejing <[email protected]>
Signed-off-by: Hongwei <[email protected]>
Signed-off-by: Liang Cheng <[email protected]>
Signed-off-by: A-transformer <[email protected]>
Co-authored-by: Raza Sikander <[email protected]>
Co-authored-by: Olatunji Ruwase <[email protected]>
Co-authored-by: Max Kovalenko <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
Co-authored-by: inkcherry <[email protected]>
Co-authored-by: wukong1992 <[email protected]>
Co-authored-by: shaomin <[email protected]>
Co-authored-by: Hongwei Chen <[email protected]>
Co-authored-by: loadams <[email protected]>
Co-authored-by: Stas Bekman <[email protected]>
Co-authored-by: siqi654321 <[email protected]>
Co-authored-by: siqi <[email protected]>
Co-authored-by: Wei Wu <[email protected]>
Co-authored-by: Masahiro Tanaka <[email protected]>
Co-authored-by: Shelly Nahir <[email protected]>
Co-authored-by: snahir <[email protected]>
Co-authored-by: Yejing-Lai <[email protected]>
Co-authored-by: A-transformer <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Unpin transformers version for all workflows except
`nv-torch-latest-v100` as this still has a tolerance issue with some
quantization tests.

Signed-off-by: Logan Adams <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Signed-off-by: inkcherry <[email protected]>